Block-Term Tensor Decomposition Model Selection and Computation: The Bayesian Way
Authors
Abstract
The so-called block-term decomposition (BTD) tensor model, especially in its rank-$(L_r,L_r,1)$ version, has recently been receiving increasing attention due to its enhanced ability to represent systems and signals that are composed of \emph{blocks} of rank higher than one, a scenario encountered in numerous and diverse applications. Uniqueness conditions and fitting methods have thus been thoroughly studied. Nevertheless, the challenging problem of estimating the BTD model structure, namely the number of block terms, $R$, and their individual ranks, $L_r$, has only recently started to attract significant attention, mainly through regularization-based approaches, which entail the need to tune the regularization parameter(s). In this work, we build on ideas of sparse Bayesian learning (SBL) and put forward a fully automated approach. Through a suitably crafted multi-level \emph{hierarchical} probabilistic model, which gives rise to heavy-tailed prior distributions for the BTD factors, structured sparsity is \emph{jointly} imposed. The ranks are then estimated from the numbers of blocks ($R$) and columns ($L_r$) of non-negligible energy. Approximate posterior inference is implemented within the variational inference framework. The resulting iterative algorithm completely avoids hyperparameter tuning, a defect of regularization-based methods. Alternative probabilistic models are also explored, and their connections with regularization-based counterparts are brought to light with the aid of the associated maximum a-posteriori (MAP) estimators. We report simulation results with both synthetic and real-world data, which demonstrate the merits of the proposed method in terms of rank estimation as compared to state-of-the-art relevant methods.
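To make the rank-$(L_r,L_r,1)$ structure concrete, here is a minimal NumPy sketch (with hypothetical dimensions and block ranks, not the paper's SBL algorithm) that synthesizes a tensor as a sum of block terms $(\mathbf{A}_r\mathbf{B}_r^{\mathsf T})\circ\mathbf{c}_r$ and checks the generic rank bound on a matrix unfolding, which is the quantity that energy-based rank estimation ultimately recovers:

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K = 8, 9, 10
L = [2, 3]  # hypothetical block ranks L_1, L_2 (so R = 2)

# X = sum_r (A_r @ B_r.T) outer c_r  --  a rank-(L_r, L_r, 1) BTD
X = np.zeros((I, J, K))
for Lr in L:
    A = rng.standard_normal((I, Lr))   # A_r: I x L_r
    B = rng.standard_normal((J, Lr))   # B_r: J x L_r
    c = rng.standard_normal(K)         # c_r: length-K vector
    X += np.einsum('il,jl,k->ijk', A, B, c)

# The mode-1 unfolding generically has rank sum_r L_r.
r1 = np.linalg.matrix_rank(X.reshape(I, J * K))
print(r1)  # generically L_1 + L_2 = 5
```

In the Bayesian approach described above, the ranks are not assumed known as here; they emerge by pruning blocks and factor columns whose posterior energy is negligible.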
Similar resources
Block-Decoupling Multivariate Polynomials Using the Tensor Block-Term Decomposition
We present a tensor-based method to decompose a given set of multivariate functions into linear combinations of a set of multivariate functions of linear forms of the input variables. The method proceeds by forming a three-way array (tensor) by stacking Jacobian matrix evaluations of the function behind each other. It is shown that a block-term decomposition of this tensor provides the necessary...
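The tensor-forming step described in this snippet can be sketched numerically; the example below (a hypothetical toy function `f` with forward-difference Jacobians, chosen only for illustration) stacks Jacobian evaluations at several sampling points into a three-way array. Computing the block-term decomposition of that array, which is the step the method relies on, is beyond this sketch.

```python
import numpy as np

def f(u):
    """Toy function R^2 -> R^2 built from two linear forms of the inputs."""
    x, y = u
    return np.array([(x + 2.0 * y) ** 2, (3.0 * x - y) ** 3])

def num_jac(u, eps=1e-6):
    """Forward-difference numerical Jacobian of f at u (outputs x inputs)."""
    u = np.asarray(u, dtype=float)
    f0 = f(u)
    J = np.zeros((f0.size, u.size))
    for i in range(u.size):
        du = np.zeros_like(u)
        du[i] = eps
        J[:, i] = (f(u + du) - f0) / eps
    return J

rng = np.random.default_rng(1)
points = rng.standard_normal((5, 2))                 # N = 5 evaluation points
T = np.stack([num_jac(p) for p in points], axis=2)   # stack Jacobians
print(T.shape)  # (2, 2, 5): outputs x inputs x evaluation points
```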
Bayesian computation and model selection without likelihoods.
Until recently, the use of Bayesian inference was limited to a few cases because for many realistic probability models the likelihood function cannot be calculated analytically. The situation changed with the advent of likelihood-free inference algorithms, often subsumed under the term approximate Bayesian computation (ABC). A key innovation was the use of a post-sampling regression adjustment, ...
Learning Compact Recurrent Neural Networks with Block-Term Tensor Decomposition
Recurrent Neural Networks (RNNs) are powerful sequence modeling tools. However, when dealing with high-dimensional inputs, the training of RNNs becomes computationally expensive due to the large number of model parameters. This hinders RNNs from solving many important computer vision tasks, such as Action Recognition in Videos and Image Captioning. To overcome this problem, we propose a compact a...
Bayesian Computation and Model Selection in Population Genetics
Until recently, the use of Bayesian inference in population genetics was limited to a few cases because for many realistic population genetic models the likelihood function cannot be calculated analytically. The situation changed with the advent of likelihood-free inference algorithms, often subsumed under the term Approximate Bayesian Computation (ABC). A key innovation was the use of a post-...
Deviance Information Criteria for Model Selection in Approximate Bayesian Computation
Approximate Bayesian computation (ABC) is a class of algorithmic methods in Bayesian inference using statistical summaries and computer simulations. ABC has become popular in evolutionary genetics and in other branches of biology. However, model selection under ABC algorithms has been a subject of intense debate in recent years. Here we propose novel approaches to model selection based o...
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2022
ISSN: 1053-587X, 1941-0476
DOI: https://doi.org/10.1109/tsp.2022.3159029